

Bull of the Day: IBM (IBM)


Key Takeaways

  • IBM watsonx.ai studio offers new foundation models, genAI, and a governance toolkit for enterprise
  • IBM Concert platform allows comprehensive orchestration of data, security, and AI workflows
  • IBM Quantum roadmap charts path to quantum advantage in 2026

Since one historic American technology company, Intel (INTC - Free Report), made big news last week, I wanted to profile another stalwart I've been cheering for.

International Business Machines (IBM - Free Report) is currently a Zacks Rank #2 (Buy) because earnings estimates continue to trend higher, with $12 EPS projected for next year and nearly $13 for 2027.

Granted, this is not the double-digit growth we see in Mag 7 names associated with AI innovation and productivity. But IBM is still growing sales in the mid-single digits and is projected to hit a $70 billion topline next year, making shares trade at just over 3X sales.
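A quick back-of-the-envelope check of that valuation claim (a sketch in Python; the $70 billion sales projection and the ~3X multiple are the figures from the paragraph above):

```python
# Implied market cap from a price-to-sales multiple.
# Figures from the article: ~$70B projected topline, "just over 3X sales".
projected_sales_usd = 70e9   # next year's projected revenue
ps_multiple = 3.0            # price-to-sales multiple

implied_market_cap = projected_sales_usd * ps_multiple
print(f"Implied market cap: ${implied_market_cap / 1e9:.0f}B")  # → $210B
```

In other words, a 3X sales multiple on $70 billion of revenue implies roughly a $210 billion market capitalization, which is modest next to the Mag 7 names.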

More importantly, ole' "Big Blue" is still highly relevant in the halls and labs of deep research in Silicon Valley and beyond.

The IBM Innovation Way Persists

And the best part of being an IBM investor with that kind of value is that we know the company will continue to be at the bleeding edges of technology innovation.

Here's what I wrote in a recent update to my quantum computing report Beyond AI: The Quantum Leap in Computing Power...

Despite IBM's lack of growth on par with the Mag 7, it remains a deep technology R&D powerhouse. And they just unveiled their 10-year roadmap for QC innovation (just search "IBM quantum roadmap").

IBM’s approach to quantum computing centers on building scalable, fault-tolerant quantum systems with industry-leading hardware and software developments. Originally, their strategic focus was on increasing qubit counts; recently, however, IBM has shifted toward enhancing gate quality and error correction, integrating quantum and classical computing through quantum-centric supercomputing architectures. This transition aims to enable practical quantum applications and maximize resource efficiency by leveraging concurrent execution in both quantum and classical domains.

IBM’s quantum platform is accessible via the upgraded IBM Quantum Platform, which connects users to real quantum computers along with comprehensive documentation and learning resources. The technical backbone features their modular Quantum System Two, which utilizes multiple Heron processors—each with significant hardware advances like tunable couplers and reduced noise. IBM’s quantum middleware automates distribution of workloads between quantum and classical systems, optimizing performance through serverless tools and intelligent orchestration of quantum processing units (QPUs).

As of mid-2025, IBM’s latest innovations include modular quantum supercomputers such as Starling and the projected Blue Jay, aiming to integrate thousands of logical qubits for scalable computation. Their novel middleware provides advanced noise correction and automated resource allocation. The unveiling of the Condor chip—featuring 1,121 superconducting qubits—marked a breakthrough for high-density architectures and long-term reliability. Initiatives like the National Quantum Algorithm Center further cement IBM’s leadership, driving algorithmic research with next-generation quantum hardware.

(end of excerpt from my Quantum Computing report and if you'd like a copy, just email Ultimate@Zacks.com and tell 'em Cooker sent you)

Can IBM's Focus on Innovative AI Solutions Spur an Uptrend?

Last week, my colleague Supriyo Bose wrote the above linked article about how IBM is deploying watsonx to advance smaller, domain-focused AI models centered on reliability and enterprise utility. Here are some excerpts...

IBM is focusing more on smaller, domain-focused artificial intelligence (AI) models, emphasizing reliability, cost-efficiency and practical enterprise utility. IBM’s watsonx platform has been the core technology platform for its AI capabilities. It delivers the value of foundation models to the enterprise, enabling organizations to be more productive.

This enterprise-ready AI and data platform comprises three products to help organizations accelerate and scale AI — the watsonx.ai studio for new foundation models, generative AI and machine learning, the watsonx.data fit-for-purpose data store built on an open lakehouse architecture and the watsonx.governance toolkit to help enable AI workflows to be built with responsibility and transparency.

IBM Concert is an AI-powered automation solution that offers intelligent resilience for complex IT operations such as patch management and the orchestration of security-related activities. It brings together all relevant data and specializations for AI-driven recommendations and workflows. Leveraging generative AI, IBM Concert develops an optimized and prioritized patching plan by building contextual awareness of system topology and business requirements, enabling end-to-end AI-powered automation.

(end of excerpts from Bose article)

It was also nice to read in his piece that IBM is still forging collaborations with companies like Salesforce (CRM - Free Report). I'll never forget the week in March of 2017 when I had planned to teach investors how a "massively parallel architecture" (MPA) was the GPU-driven infrastructure of machine learning and deep learning.

Because that week not only did Big Blue unleash a new supercomputer, but IBM Watson and Salesforce Einstein got together to announce "a global strategic partnership to deliver joint solutions designed to leverage artificial intelligence and enable companies to make smarter decisions, faster than ever before."

Here's what I wrote in Get Your "MPA" in Deep Learning...

Big happenings in AI land recently as new supercomputers get built and Einstein taps Watson's brain. Both events were announced on March 6, 2017 and old Big Blue, IBM, finds itself at the center of all of this, still as relevant as the day Deep Blue beat Garry Kasparov in 1996.

The supercomputer comes from Fujitsu, which is using 24 NVIDIA (NVDA - Free Report) DGX-1 AI systems to help build a supercomputer for RIKEN, Japan’s largest comprehensive research institution, for deep learning research. Scheduled to go online in April 2017, the RIKEN Center for Advanced Intelligence Project will use the new system as a platform to accelerate R&D into AI technology.

The largest customer installation of DGX-1 systems to date, the supercomputer will accelerate the application of AI to solve complex challenges in healthcare, manufacturing and public safety.

“DGX-1 is like a time-machine for AI researchers,” said Jen-Hsun Huang, founder and CEO of NVIDIA. “Enterprises, research centers and universities worldwide are adopting DGX-1 to ride the wave of deep learning — the technology breakthrough at the center of the AI revolution.”

(end of excerpts from my "MPA" article)

Those four opening paragraphs are symbolic of two things. First, it was still the very early stages of the AI revolution, as you can multiply for yourself: 24 DGX-1 systems each held 8 Tesla P100 GPUs (192 GPUs total), with about 15.3 billion transistors each.

Now Tesla, OpenAI, Meta Platforms (META - Free Report) and the hyperscale cloud providers are buying GPUs 100,000 at a time, and the new Blackwell architecture houses 208 billion transistors per accelerator unit!
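To put those two eras side by side, here is a rough Python sketch of the scale-up described above (the 24-system, 8-GPU-per-DGX-1 figures come from the RIKEN installation; the 100,000-GPU orders and 208-billion-transistor Blackwell figure are from this paragraph):

```python
# 2017: RIKEN's DGX-1 cluster, the largest DGX-1 installation at the time
systems_2017 = 24
gpus_per_dgx1 = 8                  # each DGX-1 held 8 Tesla P100 GPUs
gpus_2017 = systems_2017 * gpus_per_dgx1
print(f"2017 RIKEN cluster: {gpus_2017} GPUs")  # → 192 GPUs

# 2025: hyperscalers buying Blackwell accelerators ~100,000 at a time
gpus_2025_order = 100_000
transistors_blackwell = 208e9      # per Blackwell accelerator unit

scale_up = gpus_2025_order / gpus_2017
print(f"One modern order is ~{scale_up:.0f}x the entire 2017 cluster")  # → ~521x
```

A single modern GPU order is on the order of 500 times the size of what was, in 2017, the largest DGX-1 deployment in the world.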

The second symbolism here is personal for me. While I had begun learning about big data, automation, and machine learning in 2016, my knowledge curve really began to accelerate when I started listening to every word that came from Jensen's mouth, from GTC events to product and partner blogs.

"DGX-1 is like a time-machine for AI researchers" really stuck with me. I think he meant it in multiple ways: expand time, crunch time, and time travel both to the dawn of astrophysics and evolution... and to the future of space travel and extended health-spans.

And I realized Jensen was creating power tools for scientists, engineers, and other data crunchers... that would rapidly change the world as they accelerated innovation and knowledge exponentially.

Little did I know that later that summer, European Space Agency engineers would be using a cluster of 4,000 NVIDIA GPUs to model and simulate 25 billion galaxies in order to train the Euclid space telescope for its mission to map the universe and collect data on dark matter.

Since then, NVIDIA "power tools" are helping scientists and engineers cure disease, advance the energy grid, and invent new materials. And IBM is still working behind the scenes to show them how as a key NVIDIA partner in the multi-trillion-dollar IT infrastructure transition to enterprise "AI factories."

Bottom line: IBM shares have been in a bearish trend since the Q2 report, but earnings estimates have only gone up. I would look for long opportunities near $225.
